
    An evaluation framework and selection tool for education apps usability, with a case study from health education apps

    Mobile apps for health education are commonly utilised to support different users, and the development of these apps is increasing rapidly. A critical evaluation framework is needed to ensure the usability and reliability of Mobile Health Education Applications (MHEAs) and to save time and effort for stakeholders. This project aims to support the evaluation of MHEAs through the development of an evaluation framework comprising suitable metrics: an efficient hybrid of Heuristic Evaluation (HE) and Usability Evaluation (UE). The framework determines the usefulness and usability of MHEAs, with the aim of improving software engineering practice by creating more effective ways to evaluate such software. Within this framework, the Medical Apps Selection Tool (MAST) has been developed to help select suitable MHEAs, assisting stakeholders in choosing MHEAs that meet their requirements. The thesis employs two methods so that the evaluation framework can perform both qualitative and quantitative data analysis. The first is a qualitative method: interviews based on proposed hybrid metrics selected from HE and UE, conducted with three kinds of stakeholders (patients, health professionals and software developers) to identify the specific hybrid metrics relevant to measuring usability in MHEAs. These metrics are used to measure usability in different MHEAs and to rank the apps within the evaluation framework; they were converted into an evaluation questionnaire, which has been applied to several MHEAs. The second method translates the outcomes from the first method to measure the usability of MHEAs and determine what stakeholders require from them. For this purpose, it categorises stakeholders with different needs; this is reflected in the MAST, which matches different stakeholders with different MHEAs. The findings of the study indicate that the evaluation framework is able to evaluate MHEAs and record usability problems. Furthermore, the framework leads to the selection of the most appropriate apps through the MAST. The framework is expected to be applicable to other domains and platforms.
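    The stakeholder-matching idea described in this abstract can be illustrated with a small sketch: each app is scored on hybrid HE/UE usability metrics, and apps are ranked per stakeholder group by weighting the metrics that group cares about most. All app names, metric names, scores and weights below are invented for illustration; they are not taken from the thesis or the actual MAST.

    ```python
    # Hypothetical sketch of a MAST-style ranking: apps scored on hybrid
    # HE/UE metrics, ranked per stakeholder group by weighted sum.
    # Every name and number here is illustrative, not from the thesis.

    # Per-app usability scores (0-5 scale, invented values)
    app_scores = {
        "AppA": {"learnability": 4.2, "error_prevention": 3.1, "content_accuracy": 4.8},
        "AppB": {"learnability": 3.5, "error_prevention": 4.6, "content_accuracy": 3.9},
    }

    # Each stakeholder group weights the metrics differently (invented weights)
    stakeholder_weights = {
        "patient": {"learnability": 0.5, "error_prevention": 0.2, "content_accuracy": 0.3},
        "health_professional": {"learnability": 0.2, "error_prevention": 0.3, "content_accuracy": 0.5},
    }

    def rank_apps(group):
        """Rank apps for one stakeholder group by weighted metric scores."""
        w = stakeholder_weights[group]
        scored = {app: sum(w[m] * s for m, s in metrics.items())
                  for app, metrics in app_scores.items()}
        return sorted(scored.items(), key=lambda kv: kv[1], reverse=True)

    print(rank_apps("patient"))  # best-matching app for patients listed first
    ```

    The same scoring table serves every stakeholder group; only the weights change, which is one plausible way a single evaluation framework can drive per-stakeholder app selection.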

    A guidance and evaluation approach for mHealth education applications

    © Springer International Publishing AG 2017. A growing number of mobile applications for health education are being utilized to support different stakeholders, from health professionals to software developers to patients and more general users. There is, however, a lack of a critical evaluation framework to ensure the usability and reliability of these mobile health education applications (MHEAs); such a framework would save time and effort for the different user groups. This paper describes a framework for evaluating mobile applications for health education, including a guidance tool to help different stakeholders select the application most suitable for them. The framework is intended to meet the needs and requirements of the different user categories, as well as to improve the development of MHEAs through software engineering approaches. A description of the evaluation framework is provided, with its efficient hybrid of selected heuristic evaluation (HE) and usability evaluation (UE) factors. Lastly, an account of the quantitative and qualitative results of applying the framework to Medscape and other mobile apps is given. The proposed framework - an Evaluation Framework for Mobile Health Education Apps - consists of a hybrid of five metrics selected from a larger set during heuristic and usability evaluation, the choice being based on interviews with patients, software developers and health professionals.

    Evaluation of mobile health education applications for health professionals and patients

    Paper presented at 8th International conference on e-Health (EH 2016), 1-3 July 2016, Funchal, Madeira, Portugal. ABSTRACT Mobile applications for health education are commonly utilized to support patients and health professionals. A critical evaluation framework is required to ensure the usability and reliability of mobile health education applications and to save time and effort for the various user groups; thus, the aim of this paper is to describe a framework for evaluating mobile applications for health education. The intended outcome of this framework is to meet the needs and requirements of the different user categories and to improve the development of mobile health education applications with software engineering approaches, by creating new and more effective techniques to evaluate such software. This paper first highlights the importance of mobile health education apps, then explains the need to establish an evaluation framework for these apps. The paper provides a description of the evaluation framework, along with some specific evaluation metrics: an efficient hybrid of selected heuristic evaluation (HE) and usability evaluation (UE) factors to enable the determination of the usefulness and usability of health education mobile apps. Finally, the initial results obtained by applying the framework to the Medscape mobile app are explained. The proposed framework - An Evaluation Framework for Mobile Health Education Apps - is a hybrid of five metrics selected from a larger set in heuristic and usability evaluation, filtered based on interviews with patients and health professionals. These five metrics correspond to specific facets of usability identified through a requirements analysis of typical users of mobile health apps. The metrics were decomposed into 21 specific questionnaire questions, which are available on request from the first author.
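    The decomposition of metrics into questionnaire questions described above can be sketched as follows: each metric maps to several Likert-scale questions whose answers are averaged into a per-metric score. The 21 questions themselves are available only on request, so the metric names, question groupings and responses below are entirely invented for illustration.

    ```python
    # Hypothetical sketch: turning Likert questionnaire responses into
    # per-metric usability scores by averaging the questions mapped to
    # each metric. Mapping and answers are illustrative, not the paper's.
    from statistics import mean

    # Invented metric -> question-id grouping (the real 21-question
    # instrument is not public, so this mapping is an assumption)
    metric_questions = {
        "visibility": ["q1", "q2"],
        "consistency": ["q3", "q4", "q5"],
    }

    # One respondent's answers (1 = strongly disagree ... 5 = strongly agree)
    responses = {"q1": 4, "q2": 5, "q3": 3, "q4": 4, "q5": 4}

    metric_scores = {m: mean(responses[q] for q in qs)
                     for m, qs in metric_questions.items()}
    print(metric_scores)  # visibility 4.5, consistency ~3.67
    ```

    Aggregating question-level answers into metric-level scores like this is what allows apps to be compared and ranked on the five metrics rather than on 21 raw answers.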

    Heuristic Evaluation for Serious Immersive Games and M-instruction

    © Springer International Publishing Switzerland 2016. Two fast-growing areas for technology-enhanced learning are serious games and mobile instruction (M-instruction or M-Learning). Serious games are ones that are meant to be more than just entertainment: they have a serious use, to educate or to promote other types of activity. Immersive games frequently involve many players interacting in a shared, rich and complex (perhaps web-based) mixed-reality world, where their circumstances will be many and varied. Their reality may be augmented and is often self-composed, as in a user-defined avatar in a virtual world. M-instruction, or M-Learning, is learning on the move; much of modern computer use is via smart devices, pads and laptops. People use these devices everywhere, so it is a natural extension to want to use them wherever they are to learn. This presents a problem if we wish to evaluate the effectiveness of the pedagogic media learners are using: we have no way of knowing their situation, circumstances, educational background and motivation, or potentially the customisation of the final software they are using. Reaching the end user may also be problematic, since these are learning environments that people dip into at opportune moments. If access to the end user is hard because of location and user self-personalisation, then one solution is to examine the software before it goes out. Heuristic evaluation allows User Interface (UI) and User Experience (UX) experts to reflect on the software before it is deployed. The effective use of heuristic evaluation with pedagogical software [1] is extended here with existing heuristic evaluation methods, making the technique applicable to serious immersive games and mobile instruction (M-instruction). We also consider how existing heuristic methods may be adapted. The result represents a new way of making this methodology applicable to this new and developing area of learning technology.